
    Oceanids C2: An Integrated Command, Control, and Data Infrastructure for the Over-the-Horizon Operation of Marine Autonomous Systems

    Long-range Marine Autonomous Systems (MAS), operating beyond the visual line of sight of a human pilot or research ship, are creating unprecedented opportunities for oceanographic data collection. Able to operate for up to months at a time, periodically communicating with a remote pilot via satellite, long-range MAS vehicles significantly reduce the need for an expensive research-ship presence within the operating area. Heterogeneous fleets of MAS vehicles, operating simultaneously in an area for an extended period, are becoming increasingly popular because they provide an improved composite picture of the marine environment. However, at present, the expansion of the size and complexity of these multi-vehicle operations is limited by several factors: (1) custom control interfaces require pilots to be trained in the use of each individual vehicle, with limited cross-platform standardization; (2) the data produced by each vehicle are typically in a custom, vehicle-specific format, making the automated ingestion of observational data for near-real-time analysis and assimilation into operational ocean models very difficult; (3) most MAS vehicles do not provide machine-to-machine interfaces, limiting the development and use of common piloting tools, multi-vehicle operating strategies, autonomous control algorithms, and automated data delivery. In this paper, we describe a novel piloting and data management system (C2) that provides a unified web-based infrastructure for the operation of long-range MAS vehicles within the UK's National Marine Equipment Pool. The system automates the archiving, standardization, and delivery of near-real-time science data and associated metadata from the vehicles to end-users and Global Data Assembly Centers mid-mission. Through the use and promotion of standard data formats and machine interfaces throughout the C2 system, we seek to enable future opportunities to collaborate with both the marine science and robotics communities to maximize the delivery of high-quality oceanographic data for world-leading science.
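
    The paper does not publish a concrete API specification, so as a hedged illustration of what a machine-to-machine interface to a C2-style system might look like, here is a minimal Python sketch; the base URL, route, query parameters, and response fields are all hypothetical placeholders.

        # Minimal sketch of a machine-to-machine client for a C2-style API.
        # The endpoint, route, and field names are hypothetical placeholders,
        # not an API published by the paper.
        import requests

        C2_BASE = "https://c2.example.org/api"  # hypothetical base URL

        def fetch_latest_observations(vehicle_id: str) -> list:
            """Pull near-real-time science data for one vehicle in a standard format."""
            resp = requests.get(
                f"{C2_BASE}/vehicles/{vehicle_id}/observations",
                params={"format": "json", "latest": "true"},  # placeholder query options
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json()["observations"]  # assumed response shape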

    Ocean data product integration through innovation – the next level of data interoperability

    In the next decade, the pressures on ocean systems and the communities that rely on them will increase, along with impacts from the multiple stressors of climate change and human activities. Our ability to manage and sustain our oceans will depend on the data we collect and the information and knowledge derived from them. Much of the uptake of this knowledge will be outside the ocean domain, for example by policy makers, local governments, custodians, and other organizations, so it is imperative that we democratize, or open up, the access and use of ocean data. This paper looks at how technologies, scoped by standards, best practice, and communities of practice, can be deployed to change the way that ocean data are accessed, utilized, augmented, and transformed into information and knowledge. The current portal-download model, which requires the user to know what data exist, where they are stored, in what format, and with what processing, limits the uptake and use of ocean data. Using examples from a range of disciplines, a web-services model of data and information flows is presented. A framework is described, including the systems, processes, and human components, which delivers a radical rethink of the delivery of knowledge from ocean data. A series of statements describes parts of the future vision, along with recommendations about how this may be achieved. The paper recommends the development of virtual test-beds for end-to-end development of new data workflows and knowledge pathways. This supports the continued development, rationalization, and uptake of standards; creates a platform around which a community of practice can be developed; promotes cross-discipline engagement from ocean science through to ocean policy; allows the commercial sector, including the informatics sector, to partner in delivering outcomes; and provides a focus to leverage long-term sustained funding. The next 10 years will be “make or break” for many ocean systems. The decadal challenge is to develop the governance and co-operative mechanisms to harness emerging information technology to deliver on the goal of generating the information and knowledge required to sustain oceans into the future.
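
    As a concrete, hedged illustration of the web-services model the paper advocates (selection by query rather than manual portal download), the sketch below uses the ERDDAP tabledap URL convention, a widely used oceanographic data service; the server address, dataset ID, and variable names are placeholders, not examples taken from the paper.

        # Web-services model: data are selected and retrieved by query,
        # not located and downloaded by hand. ERDDAP's tabledap URL
        # convention is used for illustration; server and dataset are placeholders.
        import requests

        ERDDAP = "https://erddap.example.org/erddap/tabledap"
        DATASET = "ocean_obs_demo"  # hypothetical dataset ID

        url = (
            f"{ERDDAP}/{DATASET}.csv"
            "?time,latitude,longitude,sea_water_temperature"  # variables to return
            "&time>=2020-01-01&time<2020-02-01"               # server-side subsetting
        )
        csv_text = requests.get(url, timeout=60).text  # machine-readable result, no portal needed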

    The Evolution of Mammalian Gene Families

    Gene families are groups of homologous genes that are likely to have highly similar functions. Differences in family size due to lineage-specific gene duplication and gene loss may provide clues to the evolutionary forces that have shaped mammalian genomes. Here we analyze the gene families contained within the whole genomes of human, chimpanzee, mouse, rat, and dog. In total, we find that more than half of the 9,990 families present in the mammalian common ancestor have either expanded or contracted along at least one lineage. Additionally, we find that a large number of families are completely lost from one or more mammalian genomes, and a similar number of gene families have arisen subsequent to the mammalian common ancestor. Along the lineage leading to modern humans we infer the gain of 689 genes and the loss of 86 genes since the split from chimpanzees, including changes likely driven by adaptive natural selection. Our results imply that humans and chimpanzees differ by at least 6% (1,418 of 22,000 genes) in their complement of genes, which stands in stark contrast to the oft-cited 1.5% difference between orthologous nucleotide sequences. This genomic “revolving door” of gene gain and loss represents a large number of genetic differences separating humans from our closest relatives.
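
    The headline percentage follows directly from the counts given in the abstract; the short check below makes the arithmetic explicit (the chimpanzee-lineage count is inferred from the totals, not a number stated in the abstract).

        # Arithmetic check of the abstract's figures (illustration only).
        human_gain, human_loss = 689, 86
        total_diff, gene_count = 1_418, 22_000

        human_changes = human_gain + human_loss     # 775 genes changed on the human lineage
        chimp_changes = total_diff - human_changes  # 643 implied for the chimpanzee lineage
        print(f"{total_diff / gene_count:.1%}")     # -> 6.4%, the "at least 6%" figure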

    Perspectives on in situ Sensors for Ocean Acidification Research

    As ocean acidification (OA) sensor technology develops and improves, in situ deployment of such sensors is becoming more widespread. However, the scientific value of these data depends on the development and application of best practices for calibration, validation, and quality assurance, as well as on further development and optimization of the measurement technologies themselves. Here, we summarize the results of a 2-day workshop on OA sensor best practices held in February 2018 in Victoria, British Columbia, Canada, drawing on the collective experience and perspectives of the participants. The workshop on in situ sensors for OA research was organized around three basic questions: (1) What are the factors limiting the precision, accuracy, and reliability of sensor data? (2) What can we do to facilitate the quality assurance/quality control (QA/QC) process and optimize the utility of these data? and (3) What sort of data or metadata are needed for these data to be most useful to future users? This paper presents a synthesis of the workshop participants' discussion of these questions and the conclusions drawn.
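
    As a hedged illustration of the kind of record question (3) asks about, a minimal deployment metadata sketch might look like the following; the field names and values are illustrative placeholders, not a schema defined by the workshop.

        # Hypothetical minimal metadata record for an OA sensor deployment.
        # Field names and values are illustrative, not a workshop-defined schema.
        sensor_metadata = {
            "sensor": {"variable": "pH", "model": "placeholder-model", "serial": "0000"},
            "calibration": {"date": "2018-01-15", "reference": "TRIS-buffer spectrophotometric pH"},
            "deployment": {"lat": 48.42, "lon": -123.37, "depth_m": 10.0},
            "qc": {"procedure": "range and spike checks", "flag_scheme": "placeholder"},
        }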

    Argo data 1999-2019: two million temperature-salinity profiles and subsurface velocity observations from a global array of profiling floats.

    © The Author(s), 2020. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Wong, A. P. S., Wijffels, S. E., Riser, S. C., Pouliquen, S., Hosoda, S., Roemmich, D., Gilson, J., Johnson, G. C., Martini, K., Murphy, D. J., Scanderbeg, M., Bhaskar, T. V. S. U., Buck, J. J. H., Merceur, F., Carval, T., Maze, G., Cabanes, C., Andre, X., Poffa, N., Yashayaev, I., Barker, P. M., Guinehut, S., Belbeoch, M., Ignaszewski, M., Baringer, M. O., Schmid, C., Lyman, J. M., McTaggart, K. E., Purkey, S. G., Zilberman, N., Alkire, M. B., Swift, D., Owens, W. B., Jayne, S. R., Hersh, C., Robbins, P., West-Mack, D., Bahr, F., Yoshida, S., Sutton, P. J. H., Cancouet, R., Coatanoan, C., Dobbler, D., Juan, A. G., Gourrion, J., Kolodziejczyk, N., Bernard, V., Bourles, B., Claustre, H., D'Ortenzio, F., Le Reste, S., Le Traon, P., Rannou, J., Saout-Grit, C., Speich, S., Thierry, V., Verbrugge, N., Angel-Benavides, I. M., Klein, B., Notarstefano, G., Poulain, P., Velez-Belchi, P., Suga, T., Ando, K., Iwasaska, N., Kobayashi, T., Masuda, S., Oka, E., Sato, K., Nakamura, T., Sato, K., Takatsuki, Y., Yoshida, T., Cowley, R., Lovell, J. L., Oke, P. R., van Wijk, E. M., Carse, F., Donnelly, M., Gould, W. J., Gowers, K., King, B. A., Loch, S. G., Mowat, M., Turton, J., Rama Rao, E. P., Ravichandran, M., Freeland, H. J., Gaboury, I., Gilbert, D., Greenan, B. J. W., Ouellet, M., Ross, T., Tran, A., Dong, M., Liu, Z., Xu, J., Kang, K., Jo, H., Kim, S., & Park, H. Argo data 1999-2019: two million temperature-salinity profiles and subsurface velocity observations from a global array of profiling floats. Frontiers in Marine Science, 7, (2020): 700, doi:10.3389/fmars.2020.00700.

    In the past two decades, the Argo Program has collected, processed, and distributed over two million vertical profiles of temperature and salinity from the upper two kilometers of the global ocean. A similar number of subsurface velocity observations near 1,000 dbar have also been collected. This paper recounts the history of the global Argo Program, from its aspiration arising out of the World Ocean Circulation Experiment, to the development and implementation of its instrumentation and telecommunication systems, and the various technical problems encountered. We describe the Argo data system and its quality control procedures, and the gradual changes in the vertical resolution and spatial coverage of Argo data from 1999 to 2019. The accuracies of the float data have been assessed by comparison with high-quality shipboard measurements, and are concluded to be 0.002°C for temperature, 2.4 dbar for pressure, and 0.01 PSS-78 for salinity, after delayed-mode adjustments. Finally, the challenges faced by the vision of an expanding Argo Program beyond 2020 are discussed.

    Funding: AW, SR, and other scientists at the University of Washington (UW) were supported by the US Argo Program through the NOAA Grant NA15OAR4320063 to the Joint Institute for the Study of the Atmosphere and Ocean (JISAO) at the UW. SW and other scientists at the Woods Hole Oceanographic Institution (WHOI) were supported by the US Argo Program through the NOAA Grant NA19OAR4320074 (CINAR/WHOI Argo). The Scripps Institution of Oceanography's role in Argo was supported by the US Argo Program through the NOAA Grant NA15OAR4320071 (CIMEC). Euro-Argo scientists were supported by the Monitoring the Oceans and Climate Change with Argo (MOCCA) project, under the Grant Agreement EASME/EMFF/2015/1.2.1.1/SI2.709624 for the European Commission.
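
    For readers who want to work with these profiles programmatically, one option (our choice of tool, not one described in the paper) is the community argopy package; the float WMO number below is an arbitrary example.

        # Fetch Argo profiles with the community argopy package (an assumption:
        # the paper describes the data system, not this client library).
        from argopy import DataFetcher

        ds = DataFetcher().float(6902746).to_xarray()  # all profiles from one float (example WMO)
        print(ds["TEMP"])  # temperature; salinity is in ds["PSAL"], pressure in ds["PRES"]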

    Ocean FAIR Data Services

    Well-founded data management systems are of vital importance for ocean observing systems, as they ensure that essential data are not only collected but also retained and made accessible for analysis and application by current and future users. Effective data management requires collaboration across activities including observations, metadata and data assembly, quality assurance and control (QA/QC), and data publication that enables local and interoperable discovery and access, and secures archiving that guarantees long-term preservation. To achieve this, data should be findable, accessible, interoperable, and reusable (FAIR). Here, we outline how these principles apply to ocean data and illustrate them with a few examples. In recent decades, ocean data managers, in close collaboration with international organizations, have played an active role in improving environmental data standardization, accessibility, and interoperability through different projects, enhancing access to observation data at all stages of the data life cycle and fostering the development of integrated services targeted at research, regulatory, and operational users. As ocean observing systems evolve and an increasing number of autonomous platforms and sensors are deployed, the volume and variety of data increase dramatically. For instance, more than 70 data catalogs contain metadata records for the polar oceans, a situation that puts comprehensive data discovery beyond the capacity of most researchers. To better serve research, operational, and commercial users, faster turnaround of quality data in known formats, made available through Web services, is necessary. In particular, automation of data workflows will be critical to reducing friction throughout the data value chain. Adhering to the FAIR principles, with free, timely, and unrestricted access to ocean observation data, benefits the originators, has obvious benefits for users, and is an essential foundation for the development of new services made possible by big data technologies.
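
    As a small, hedged illustration of "findable" and "interoperable" in practice, the sketch below resolves a DOI to machine-readable citation metadata via standard content negotiation (a generic DOI-system mechanism, not a service defined in the paper), using the DOI of the Argo paper above as the example.

        # Resolve a DOI to structured citation metadata via content negotiation.
        # This is a generic DOI-system mechanism, not a service from the paper.
        import requests

        doi = "10.3389/fmars.2020.00700"  # the Argo paper cited above
        resp = requests.get(
            f"https://doi.org/{doi}",
            headers={"Accept": "application/vnd.citationstyles.csl+json"},
            timeout=30,
        )
        metadata = resp.json()  # title, authors, dates as machine-readable JSON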

    Cerebral microbleeds and intracranial haemorrhage risk in patients anticoagulated for atrial fibrillation after acute ischaemic stroke or transient ischaemic attack (CROMIS-2): a multicentre observational cohort study

    Background: Cerebral microbleeds are a potential neuroimaging biomarker of cerebral small vessel diseases that are prone to intracranial bleeding. We aimed to determine whether presence of cerebral microbleeds can identify patients at high risk of symptomatic intracranial haemorrhage when anticoagulated for atrial fibrillation after recent ischaemic stroke or transient ischaemic attack.

    Methods: Our observational, multicentre, prospective inception cohort study recruited adults aged 18 years or older from 79 hospitals in the UK and one in the Netherlands with atrial fibrillation and recent acute ischaemic stroke or transient ischaemic attack, treated with a vitamin K antagonist or direct oral anticoagulant, and followed up for 24 months using general practitioner and patient postal questionnaires, telephone interviews, hospital visits, and National Health Service digital data on hospital admissions or death. We excluded patients if they could not undergo MRI, had a definite contraindication to anticoagulation, or had previously received therapeutic anticoagulation. The primary outcome was symptomatic intracranial haemorrhage occurring at any time before the final follow-up at 24 months. The log-rank test was used to compare rates of intracranial haemorrhage between those with and without cerebral microbleeds. We developed two prediction models using Cox regression: first, including all predictors associated with intracranial haemorrhage at the 20% level in univariable analysis; and second, including cerebral microbleed presence and HAS-BLED score. We then compared these with the HAS-BLED score alone. This study is registered with ClinicalTrials.gov, number NCT02513316.

    Findings: Between Aug 4, 2011, and July 31, 2015, we recruited 1490 participants, of whom follow-up data were available for 1447 (97%), over a mean period of 850 days (SD 373; 3366 patient-years). The symptomatic intracranial haemorrhage rate in patients with cerebral microbleeds was 9·8 per 1000 patient-years (95% CI 4·0–20·3) compared with 2·6 per 1000 patient-years (95% CI 1·1–5·4) in those without cerebral microbleeds (adjusted hazard ratio 3·67, 95% CI 1·27–10·60). Compared with the HAS-BLED score alone (C-index 0·41, 95% CI 0·29–0·53), models including cerebral microbleeds and HAS-BLED (0·66, 0·53–0·80) and cerebral microbleeds, diabetes, anticoagulant type, and HAS-BLED (0·74, 0·60–0·88) predicted symptomatic intracranial haemorrhage significantly better (difference in C-index 0·25, 95% CI 0·07–0·43, p=0·0065; and 0·33, 0·14–0·51, p=0·00059, respectively).

    Interpretation: In patients with atrial fibrillation anticoagulated after recent ischaemic stroke or transient ischaemic attack, cerebral microbleed presence is independently associated with symptomatic intracranial haemorrhage risk and could be used to inform anticoagulation decisions. Large-scale collaborative observational cohort analyses are needed to refine and validate intracranial haemorrhage risk scores incorporating cerebral microbleeds to identify patients at risk of net harm from oral anticoagulation.

    Funding: The Stroke Association and the British Heart Foundation.
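
    A minimal sketch of the paper's second prediction model (cerebral microbleed presence plus HAS-BLED score) as a Cox regression, using the lifelines package; the data frame below is placeholder data, not the CROMIS-2 cohort, and the small penalty is only there to keep the toy fit stable.

        # Cox model with microbleed presence and HAS-BLED, in the spirit of the
        # paper's second prediction model. Placeholder data, not the study cohort.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "time_days":   [850, 400, 730, 910, 120, 640, 780, 300],  # follow-up time
            "ich_event":   [0,   1,   0,   0,   1,   0,   0,   1],    # symptomatic ICH
            "microbleeds": [1,   1,   0,   0,   1,   0,   1,   0],    # CMB presence
            "has_bled":    [2,   3,   1,   2,   4,   1,   3,   2],    # HAS-BLED score
        })

        cph = CoxPHFitter(penalizer=0.1)  # small ridge penalty for the toy data
        cph.fit(df, duration_col="time_days", event_col="ich_event")
        cph.print_summary()  # hazard ratios for microbleeds and HAS-BLED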

    Evolving and sustaining ocean best practices and standards for the next decade

    The oceans play a key role in global issues such as climate change, food security, and human health. Given their vast dimensions and internal complexity, efficient monitoring and prediction of the planet's oceans must be a collaborative effort on both regional and global scales. A first and foremost requirement for such collaborative ocean observing is the need to follow well-defined and reproducible methods across activities: from strategies for structuring observing systems, sensor deployment and usage, and the generation of data and information products, to ethical and governance aspects of executing ocean observing. To meet the urgent, planet-wide challenges we face, methods across all aspects of ocean observing should be broadly adopted by the ocean community and, where appropriate, should evolve into “Ocean Best Practices.” While many groups have created best practices, these are scattered across the Web or buried in local repositories, and many have yet to be digitized. To reduce this fragmentation, we introduce a new open-access, permanent, digital repository of best practices documentation (oceanbestpractices.org) that is part of the Ocean Best Practices System (OBPS). The new OBPS provides an opportunity space for the centralized and coordinated improvement of ocean observing methods. The OBPS repository employs user-friendly software to significantly improve discovery of and access to methods, including advanced semantic search technologies. In addition to the repository, the OBPS includes a peer-reviewed journal research topic, a forum for community discussion, and a training activity for the use of best practices. Together, these components serve a core objective of the OBPS: to enable the ocean community to create superior methods for every activity in ocean observing, from research to operations to applications, that are agreed upon and broadly adopted across communities. Using selected ocean observing examples, we show how the OBPS supports this objective. This paper lays out a future vision of ocean best practices and how the OBPS will contribute to improving ocean observing in the decade to come.

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth, and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
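
    The abstract's accuracy figures imply a concrete expected error count; the one-line check below makes that explicit (an illustration derived from the stated numbers, not a figure reported in the paper).

        # Back-of-envelope check of the stated error rate (illustration only).
        bases = 2.85e9             # nucleotides in Build 35
        error_rate = 1e-5          # ~1 event per 100,000 bases
        print(bases * error_rate)  # -> ~28,500 expected errors genome-wide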